Results 1 - 20 of 33
1.
IEEE Trans Vis Comput Graph ; 30(4): 1942-1955, 2024 Apr.
Article in English | MEDLINE | ID: mdl-37030777

ABSTRACT

This article presents a well-scaling parallel algorithm for the computation of Morse-Smale (MS) segmentations, including the region separators and region boundaries. The segmentation of the domain into ascending and descending manifolds, defined solely on the vertices, improves the computational time using path compression and fully segments the border region. Region boundaries and region separators are generated using a multi-label marching tetrahedra algorithm. This enables a fast and simple way to find optimal parameter settings in preliminary exploration steps by generating an MS complex preview, and it also offers a rapid visual representation of the region geometries for immediate use. Two experiments demonstrate the performance of our approach, with speedups of over an order of magnitude compared to two publicly available implementations. The example section shows the similarity to the MS complex, the usability of the approach, and the benefits of this method with respect to the presented datasets. We provide our implementation with the paper.
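
To make the path-compression idea concrete, here is a minimal Python sketch of ascending-manifold labeling on a vertex graph: each vertex points to its steepest ascending neighbor, and pointer jumping compresses these paths until every vertex knows which maximum it reaches. This is our illustration of the general technique, not the authors' parallel implementation.

    # Sketch: ascending-manifold labeling via pointer jumping (path
    # compression). Assumes one scalar value per vertex and an adjacency
    # list; ties are broken by vertex index.

    def ascending_segmentation(values, neighbors):
        n = len(values)
        succ = list(range(n))
        for v in range(n):
            best = v
            for u in neighbors[v]:
                if (values[u], u) > (values[best], best):
                    best = u
            succ[v] = best              # steepest ascending neighbor (or self)
        changed = True
        while changed:                  # pointer jumping: O(log n) rounds,
            changed = False             # each round is trivially parallel
            for v in range(n):
                if succ[succ[v]] != succ[v]:
                    succ[v] = succ[succ[v]]
                    changed = True
        return succ                     # succ[v] = the maximum v ascends to

    # Toy 1D field with maxima at indices 1 and 4:
    vals = [0.0, 2.0, 1.0, 0.5, 3.0, 1.5]
    nbrs = [[1], [0, 2], [1, 3], [2, 4], [3, 5], [4]]
    print(ascending_segmentation(vals, nbrs))   # -> [1, 1, 1, 4, 4, 4]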

2.
IEEE Trans Vis Comput Graph ; 30(1): 1095-1105, 2024 Jan.
Article in English | MEDLINE | ID: mdl-37878452

ABSTRACT

Comparative visualization of scalar fields is often facilitated using similarity measures such as edit distances. In this paper, we describe a novel approach for similarity analysis of scalar fields that combines two recently introduced techniques: Wasserstein geodesics/barycenters as well as path mappings, a branch decomposition-independent edit distance. Effectively, we are able to leverage the reduced susceptibility of path mappings to small perturbations in the data when compared with the original Wasserstein distance. Our approach therefore exhibits superior performance and quality in typical tasks such as ensemble summarization, ensemble clustering, and temporal reduction of time series, while retaining practically feasible runtimes. Beyond studying theoretical properties of our approach and discussing implementation aspects, we describe a number of case studies that provide empirical insights into its utility for comparative visualization, and demonstrate the advantages of our method in both synthetic and real-world scenarios. We supply a C++ implementation that can be used to reproduce our results.
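
All of the listed tasks (summarization, clustering, temporal reduction) operate on a matrix of pairwise distances between ensemble members. As a minimal sketch of that outer loop, assuming some scalar-field distance `dist` is available (the paper's path-mapping distance, a Wasserstein distance, or any other metric); all names here are our own, not the paper's API:

    # Sketch: ensemble clustering on top of a pairwise distance.
    import itertools

    def distance_matrix(members, dist):
        n = len(members)
        D = [[0.0] * n for _ in range(n)]
        for i, j in itertools.combinations(range(n), 2):
            D[i][j] = D[j][i] = dist(members[i], members[j])
        return D

    def k_medoids(D, k, iters=20):
        # Plain k-medoids: assign each member to its nearest medoid, then
        # re-pick the cheapest medoid per cluster, until stable.
        n = len(D)
        medoids = list(range(k))                  # naive initialization
        clusters = [[] for _ in range(k)]
        for _ in range(iters):
            clusters = [[] for _ in range(k)]
            for p in range(n):
                nearest = min(range(k), key=lambda c: D[p][medoids[c]])
                clusters[nearest].append(p)
            new = [min(c, key=lambda m: sum(D[m][q] for q in c)) if c else medoids[i]
                   for i, c in enumerate(clusters)]
            if new == medoids:
                break
            medoids = new
        return medoids, clusters

    # Toy usage with a 1D stand-in "field" and absolute difference:
    pts = [0.0, 0.1, 0.2, 5.0, 5.1]
    D = distance_matrix(pts, lambda a, b: abs(a - b))
    print(k_medoids(D, k=2))   # -> ([1, 3], [[0, 1, 2], [3, 4]])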

3.
IEEE Trans Vis Comput Graph ; 30(1): 1085-1094, 2024 Jan.
Article in English | MEDLINE | ID: mdl-37871087

ABSTRACT

Over the last decade, merge trees have proven to support a plethora of visualization and analysis tasks, since they effectively abstract complex datasets. This paper describes ExTreeM, a scalable algorithm for the computation of merge trees via extremum graphs. The core idea of ExTreeM is to first derive the extremum graph G of an input scalar field f defined on a cell complex K, and subsequently compute the unaugmented merge tree of f on G instead of K; the two trees are equivalent. Any merge tree algorithm can be carried out significantly faster on G, since K in general contains substantially more cells than G. To further speed up computation, ExTreeM includes a tailored procedure to derive merge trees of extremum graphs. The computation of the fully augmented merge tree, i.e., a merge tree domain segmentation of K, can then be performed in an optional post-processing step. All steps of ExTreeM consist of procedures with high parallel efficiency, and we provide a formal proof of its correctness. Our experiments, performed on publicly available datasets, report a speedup of up to one order of magnitude over the state-of-the-art algorithms included in the TTK and VTK-m software libraries, while also requiring significantly less memory and exhibiting excellent scaling behavior.
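
For context, the textbook sequential join-tree construction that such methods accelerate looks roughly as follows; ExTreeM's saving comes from running a (tailored) procedure of this kind on the small graph G rather than on K. A generic union-find sketch of ours, not the ExTreeM procedure itself:

    # Sketch: textbook join-tree construction. Sweep vertices from highest
    # to lowest value; union-find tracks superlevel-set components, and an
    # arc is emitted whenever components merge at a saddle.

    def join_tree_arcs(values, neighbors):
        order = sorted(range(len(values)), key=lambda v: (values[v], v), reverse=True)
        parent, head, arcs = {}, {}, []

        def find(v):
            while parent[v] != v:
                parent[v] = parent[parent[v]]   # path halving
                v = parent[v]
            return v

        for v in order:
            parent[v] = v
            head[v] = v                          # birth vertex of v's component
            roots = {find(u) for u in neighbors[v] if u in parent}
            if len(roots) == 1:                  # regular vertex: extend component
                parent[v] = roots.pop()
            elif len(roots) >= 2:                # saddle: merging components die here
                for r in roots:
                    arcs.append((head[r], v))
                    parent[r] = v
            # len(roots) == 0: v is a local maximum starting a new component
        root = find(order[-1])
        arcs.append((head[root], order[-1]))     # trunk down to the global minimum
        return arcs

    # 1D toy field: maxima at indices 1 and 3 merge at the saddle (index 2).
    vals = [0.0, 2.0, 1.0, 3.0, 0.5]
    nbrs = [[1], [0, 2], [1, 3], [2, 4], [3]]
    print(join_tree_arcs(vals, nbrs))   # -> [(1, 2), (3, 2), (2, 0)]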

4.
Phys Med Biol ; 68(24), 2023 Dec 15.
Article in English | MEDLINE | ID: mdl-37949060

ABSTRACT

Objective. Gradient-based optimization using algorithmic derivatives can be a useful technique to improve engineering designs with respect to a computer-implemented objective function. Likewise, uncertainty quantification through computer simulations can be carried out by means of derivatives of the computer simulation. However, the effectiveness of these techniques depends on how 'well-linearizable' the software is. In this study, we assess how promising derivative information of a typical proton computed tomography (pCT) scan computer simulation is for the aforementioned applications. Approach. This study is mainly based on numerical experiments, in which we repeatedly evaluate three representative computational steps with perturbed input values. We support our observations with a review of the algorithmic steps and arithmetic operations performed by the software, using debugging techniques. Main results. The model-based iterative reconstruction (MBIR) subprocedure (at the end of the software pipeline) and the Monte Carlo (MC) simulation (at the beginning) were piecewise differentiable. However, the observed high density and magnitude of jumps were likely to preclude most meaningful uses of the derivatives. Jumps in the MBIR function arose from the discrete computation of the set of voxels intersected by a proton path, and could be reduced in magnitude by a 'fuzzy voxels' approach. The investigated jumps in the MC function arose from local changes in the control flow that affected the amount of consumed random numbers. The tracking algorithm solves an inherently non-differentiable problem. Significance. Besides the technical challenges of merely applying algorithmic differentiation (AD) to existing software projects, the MC and MBIR codes must be adapted to compute smoother functions. For the MBIR code, we present one possible approach for this, while for the MC code this will be the subject of further research. For the tracking subprocedure, further research on surrogate models is necessary.


Subject(s)
Protons , Tomography, X-Ray Computed , Computer Simulation , Phantoms, Imaging , Tomography, X-Ray Computed/methods , Software , Algorithms , Monte Carlo Method
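
The 'fuzzy voxels' idea can be illustrated in one dimension: replace a voxel's hard 0/1 membership in the proton path with a smooth weight, so the reconstruction objective varies continuously (and differentiably) as the path moves. A hypothetical sketch of that substitution, not the paper's code:

    # Sketch: hard vs. fuzzy voxel membership for a path at position x.
    # Hard membership jumps when the path crosses a voxel boundary; the
    # fuzzy weight varies smoothly, removing the jump in the objective.
    import math

    def hard_weight(voxel_center, x, half_width=0.5):
        return 1.0 if abs(x - voxel_center) <= half_width else 0.0

    def fuzzy_weight(voxel_center, x, half_width=0.5, softness=0.1):
        # Smooth ramp: ~1 deep inside the voxel, ~0 far outside.
        d = abs(x - voxel_center) - half_width
        return 1.0 / (1.0 + math.exp(d / softness))

    for x in (0.4, 0.49, 0.51, 0.6):
        print(x, hard_weight(0.0, x), round(fuzzy_weight(0.0, x), 3))
    # hard_weight flips from 1 to 0 between x=0.49 and x=0.51;
    # fuzzy_weight passes smoothly through ~0.5 there.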
5.
Plant J ; 116(4): 974-988, 2023 Nov.
Article in English | MEDLINE | ID: mdl-37818860

ABSTRACT

In modern reproducible, hypothesis-driven plant research, scientists are increasingly relying on research data management (RDM) services and infrastructures to streamline the processes of collecting, processing, sharing, and archiving research data. FAIR (i.e., findable, accessible, interoperable, and reusable) research data play a pivotal role in enabling the integration of interdisciplinary knowledge and facilitating the comparison and synthesis of a wide range of analytical findings. The PLANTdataHUB offers a solution that realizes RDM of scientific (meta)data as evolving collections of files in a directory - yielding FAIR digital objects called ARCs - with tools that enable scientists to plan, communicate, collaborate, publish, and reuse data on the same platform while gaining continuous quality control insights. The centralized platform is scalable from personal use to global communities and provides advanced federation capabilities for institutions that prefer to host their own satellite instances. This approach borrows many concepts from software development and adapts them to the challenges of modern plant science as it undergoes digital transformation. The PLANTdataHUB supports researchers in each stage of a scientific project with adaptable continuous quality control insights, from the early planning phase to data publication. The central live instance of PLANTdataHUB is accessible at https://git.nfdi4plants.org, and it will continue to evolve as a community-driven and dynamic resource that serves the needs of contemporary plant science.


Subject(s)
Databases as Topic , Information Dissemination , Plants
6.
Phys Med Biol ; 68(19), 2023 Sep 20.
Article in English | MEDLINE | ID: mdl-37652034

ABSTRACT

Objective. Proton therapy is highly sensitive to range uncertainties due to the nature of the dose deposition of charged particles. To ensure treatment quality, range verification methods can be used to verify that the individual spots in a pencil beam scanning treatment fraction match the treatment plan. This study introduces a novel metric for proton therapy quality control based on uncertainties in range verification of individual spots. Approach. We employ uncertainty-aware deep neural networks to predict the Bragg peak depth in an anthropomorphic phantom based on secondary charged particle detection in a silicon pixel telescope designed for proton computed tomography. The predicted Bragg peak positions, along with their uncertainties, are compared to the treatment plan, rejecting spots which are predicted to be outside the 95% confidence interval. The resulting spot rejection rate serves as a metric for the quality of the treatment fraction. Main results. The introduced spot rejection rate metric is shown to be well-defined for range predictors with well-calibrated uncertainties. Using this method, treatment errors in the form of lateral shifts can be detected down to 1 mm after around 1400 treated spots with spot intensities of 1 × 10⁷ protons. The range verification model used in this metric predicts the Bragg peak depth to a mean absolute error of 1.107 ± 0.015 mm. Significance. Uncertainty-aware machine learning has potential applications in proton therapy quality control. This work presents the foundation for future developments in this area.


Subject(s)
Proton Therapy , Uncertainty , Protons , Machine Learning , Neural Networks, Computer
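
The metric itself is simple once per-spot predictions with calibrated uncertainties exist: a spot is rejected when the planned depth falls outside the predicted 95% confidence interval. A toy sketch with invented numbers, not data from the study:

    # Sketch: spot rejection rate from uncertainty-aware range predictions.
    # A spot is rejected when the planned Bragg peak depth lies outside the
    # predicted 95% confidence interval (mean +/- 1.96 * sigma).

    def spot_rejection_rate(planned, predicted_mean, predicted_sigma, z=1.96):
        rejected = sum(
            1 for p, m, s in zip(planned, predicted_mean, predicted_sigma)
            if abs(p - m) > z * s
        )
        return rejected / len(planned)

    planned   = [100.0, 120.0, 140.0]    # planned depths, mm (toy values)
    pred_mean = [100.5, 125.0, 139.8]    # model predictions, mm
    pred_sig  = [  1.0,   1.2,   0.9]    # predicted uncertainties, mm
    print(spot_rejection_rate(planned, pred_mean, pred_sig))
    # -> 0.333...: the second spot (5 mm off, CI half-width ~2.35 mm) is rejected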
7.
IEEE Trans Vis Comput Graph ; 28(1): 1117-1127, 2022 Jan.
Article in English | MEDLINE | ID: mdl-34591761

ABSTRACT

We present Knowledge Rocks, an implementation strategy and guideline for augmenting visualization systems into knowledge-assisted visualization systems, as defined by the KAVA model. Visualization systems are becoming increasingly sophisticated. Hence, it is increasingly important to support users with an integrated knowledge base in making constructive choices and drawing the right conclusions. We support the effective reactivation of visualization software resources by augmenting them with knowledge assistance. To provide a general and yet supportive implementation strategy, we propose an implementation process that is based on an application-agnostic architecture. This architecture is derived from existing knowledge-assisted visualization systems and the KAVA model. Its centerpiece is an ontology that automatically analyzes and classifies input data, linked to a database that stores the classified instances. We discuss design decisions and advantages of the KR framework and illustrate its broad range of applications through diverse possibilities for integrating this architecture into an existing visualization system. In addition, we provide a detailed case study in which we augment an IT-security system with knowledge-assistance facilities.

8.
Acta Oncol ; 60(11): 1413-1418, 2021 Nov.
Article in English | MEDLINE | ID: mdl-34259117

ABSTRACT

BACKGROUND: Proton computed tomography (pCT) and radiography (pRad) are proposed modalities for improved treatment plan accuracy and in situ treatment validation in proton therapy. The pCT system of the Bergen pCT collaboration is able to handle very high particle intensities by means of track reconstruction. However, incorrectly reconstructed and secondary tracks degrade the image quality. We have investigated whether a convolutional neural network (CNN)-based filter is able to improve the image quality. MATERIAL AND METHODS: The CNN was trained by simulation and reconstruction of tens of millions of proton and helium tracks. The CNN filter was then compared to simple energy loss threshold methods using the Area Under the Receiver Operating Characteristic curve (AUROC), and by comparing the image quality and Water Equivalent Path Length (WEPL) error of proton and helium radiographs filtered with the same methods. RESULTS: The CNN method led to a considerable improvement of the AUROC, from 74.3% to 97.5% with protons and from 94.2% to 99.5% with helium. The CNN filtering reduced the WEPL error in the helium radiograph from 1.03 mm to 0.93 mm, while no improvement was seen in the CNN-filtered pRads. CONCLUSION: The CNN improved the filtering of proton and helium tracks. Only in the helium radiograph did this lead to improved image quality.


Subject(s)
Telescopes , Humans , Image Processing, Computer-Assisted , Monte Carlo Method , Neural Networks, Computer , Phantoms, Imaging , Radiography
9.
IEEE Trans Vis Comput Graph ; 27(8): 3585-3596, 2021 Aug.
Article in English | MEDLINE | ID: mdl-33929962

ABSTRACT

Contemporary scientific data sets require fast and scalable topological analysis to enable visualization, simplification and interaction. Within this field, parallel merge tree construction has seen abundant recent contributions, with a trend of decentralized, task-parallel or SMP-oriented algorithms dominating in terms of total runtime. However, none of these recent approaches computed complete merge trees on distributed systems, leaving this field to traditional divide & conquer approaches. This article introduces a scalable, parallel and distributed algorithm for merge tree construction that outperforms the previously fastest distributed solution by a factor of around three. This is achieved by a task-parallel identification of individual merge tree arcs, grown as regions around critical points in the data without any need for ordered progression or global data structures, based on a novel insight that introduces a sufficient local boundary for region growth.
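
A toy serial version of such region growing, under our own simplified stopping rule (stop when the popped vertex also touches higher terrain outside the region), conveys the flavor of arc-by-arc growth; the paper's distributed algorithm and its sufficient local boundary criterion are more involved:

    # Sketch: growing one merge-tree arc from a local maximum with a
    # max-heap over the region boundary. The region absorbs its highest
    # boundary vertex until it pops one that can also be reached from
    # higher ground outside -- a saddle candidate where the arc ends.
    import heapq

    def grow_arc(seed_max, values, neighbors):
        region = {seed_max}
        heap = [(-values[u], u) for u in neighbors[seed_max]]
        heapq.heapify(heap)
        while heap:
            _, v = heapq.heappop(heap)
            if v in region:
                continue
            if any(values[u] > values[v] and u not in region
                   for u in neighbors[v]):
                return region, v          # arc from seed_max down to saddle v
            region.add(v)
            for u in neighbors[v]:
                if u not in region:
                    heapq.heappush(heap, (-values[u], u))
        return region, None               # reached a minimum: arc ends there

    # Same toy field as before: the arc from the maximum at index 3 ends
    # at the saddle at index 2.
    vals = [0.0, 2.0, 1.0, 3.0, 0.5]
    nbrs = [[1], [0, 2], [1, 3], [2, 4], [3]]
    print(grow_arc(3, vals, nbrs))        # -> ({3, 4}, 2)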

10.
IEEE Trans Vis Comput Graph ; 27(2): 572-582, 2021 Feb.
Article in English | MEDLINE | ID: mdl-33048688

ABSTRACT

This paper describes a localized algorithm for the topological simplification of scalar data, an essential pre-processing step of topological data analysis (TDA). Given a scalar field f and a selection of extrema to preserve, the proposed localized topological simplification (LTS) derives a function g that is close to f and only exhibits the selected set of extrema. Specifically, sub- and superlevel set components associated with undesired extrema are first locally flattened and then correctly embedded into the global scalar field, such that these regions are guaranteed-from a combinatorial perspective-to no longer contain any undesired extrema. In contrast to previous global approaches, LTS only and independently processes regions of the domain that actually need to be simplified, which already results in a noticeable speedup. Moreover, due to the localized nature of the algorithm, LTS can utilize shared-memory parallelism to simplify regions simultaneously with a high parallel efficiency (70%). Hence, LTS significantly improves interactivity for the exploration of simplification parameters and their effect on subsequent topological analysis. For such exploration tasks, LTS brings the overall execution time of a plethora of TDA pipelines from minutes down to seconds, with an average observed speedup over state-of-the-art techniques of up to ×36. Furthermore, in the special case where preserved extrema are selected based on topological persistence, an adapted version of LTS partially computes the persistence diagram and simultaneously simplifies features below a predefined persistence threshold. The effectiveness of LTS, its parallel efficiency, and its resulting benefits for TDA are demonstrated on several simulated and acquired datasets from different application domains, including physics, chemistry, and biomedical imaging.
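
The core local operation is easy to state: to cancel an undesired extremum, lift (or lower) its sub-/superlevel-set component exactly to the level at which it merges with the rest of the domain. A toy 1D sketch with the component and merge level given explicitly; computing them efficiently, and embedding the result correctly, is what LTS contributes:

    # Sketch: local flattening of one undesired minimum. Values outside
    # the component are untouched; in practice ties are resolved by
    # symbolic perturbation to keep the result combinatorially valid.

    def flatten_minimum(values, component, merge_level):
        g = list(values)
        for v in component:
            g[v] = max(g[v], merge_level)   # lift the pocket to its merge level
        return g

    f = [3.0, 1.0, 2.5, 0.0, 3.0]   # minima at index 1 (shallow) and 3 (deep)
    # The shallow minimum merges with the deeper basin at level 2.5.
    print(flatten_minimum(f, component=[1], merge_level=2.5))
    # -> [3.0, 2.5, 2.5, 0.0, 3.0]; the undesired local minimum is gone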

11.
IEEE Trans Vis Comput Graph ; 26(4): 1638-1649, 2020 Apr.
Article in English | MEDLINE | ID: mdl-31995496

ABSTRACT

Operational technology (OT) networks, i.e., the hardware and software used for monitoring and controlling physical/industrial processes, have long been considered immune to cyber attacks. A recent increase of attacks in these networks proves this assumption wrong. Several technical constraints lead to approaches that detect attacks on industrial processes using available sensor data. This setting differs fundamentally from anomaly detection in IT-network traffic and requires new visualization approaches adapted to the common periodical behavior in OT-network data. We present a tailored visualization system that fully utilizes inherent features of measurements from industrial processes to provide insight into the data and support triage analysis by laymen and experts. The novel combination of spiral plots with results from anomaly detection was implemented in an interactive system. The capabilities of our system are demonstrated using sensor and actuator data from a real-world water treatment process with introduced attacks. Exemplary analysis strategies are presented. Finally, we evaluate the effectiveness and usability of our system and report an expert evaluation.
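
The spiral layout exploits the process's known period: one turn corresponds to one cycle, so samples from the same phase of consecutive cycles align radially and periodic deviations stand out. A minimal sketch of the coordinate mapping (parameter names are our own):

    # Sketch: mapping a periodic sensor signal onto spiral coordinates.
    # One turn = one process cycle; any 2D plotting code can render the
    # resulting points, colored e.g. by an anomaly score.
    import math

    def spiral_coords(timestamps, period, r0=1.0, dr_per_turn=0.1):
        pts = []
        for t in timestamps:
            turns = t / period
            angle = 2.0 * math.pi * turns
            radius = r0 + dr_per_turn * turns
            pts.append((radius * math.cos(angle), radius * math.sin(angle)))
        return pts

    # Samples from the same phase of consecutive cycles land on the same ray:
    print(spiral_coords([0.0, 60.0, 120.0], period=60.0))
    # -> radii 1.0, 1.1, 1.2, all at angle ~0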

12.
IEEE Trans Vis Comput Graph ; 26(1): 249-258, 2020 Jan.
Article in English | MEDLINE | ID: mdl-31581084

ABSTRACT

This work describes an approach for the interactive visual analysis of large-scale simulations, where numerous superlevel set components and their evolution are of primary interest. The approach first derives, at simulation runtime, a specialized Cinema database that consists of images of component groups and topological abstractions. This database is processed by a novel graph operation-based nested tracking graph algorithm (GO-NTG) that dynamically computes NTGs for component groups based on size, overlap, persistence, and level thresholds. The resulting NTGs are in turn used in a feature-centered visual analytics framework to query specific database elements and update feature parameters, facilitating flexible post hoc analysis.
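
Tracking graphs of this kind are built from overlaps between components of consecutive timesteps. A minimal sketch of that primitive, with components as sets of cell ids (the nesting across superlevel thresholds is omitted here):

    # Sketch: overlap-based tracking between two timesteps. An edge links
    # component i at t0 to component j at t1 when they share enough cells.

    def tracking_edges(components_t0, components_t1, min_overlap=1):
        edges = []
        for i, a in enumerate(components_t0):
            for j, b in enumerate(components_t1):
                overlap = len(a & b)
                if overlap >= min_overlap:
                    edges.append((i, j, overlap))
        return edges

    t0 = [{1, 2, 3}, {10, 11}]
    t1 = [{2, 3, 4}, {11, 12}, {20}]
    print(tracking_edges(t0, t1))   # -> [(0, 0, 2), (1, 1, 1)]; {20} is born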

13.
IEEE Comput Graph Appl ; 39(6): 76-85, 2019.
Article in English | MEDLINE | ID: mdl-31714213

ABSTRACT

In situ visualization is an increasingly important approach for computational science, as it can address limitations on leading-edge high-performance computers and can also provide increased spatio-temporal resolution. However, there are many open research issues in effective in situ processing. This article describes the challenges identified by a recent Dagstuhl Seminar on the topic.

14.
Biochem Cell Biol ; 92(6): 489-98, 2014 Dec.
Article in English | MEDLINE | ID: mdl-24943357

ABSTRACT

Mitochondrial ribosomes of baker's yeast contain at least 78 protein subunits. All but one of these proteins are nuclear-encoded, synthesized on cytosolic ribosomes, and imported into the matrix for biogenesis. The import of matrix proteins typically relies on N-terminal mitochondrial targeting sequences that form positively charged amphipathic helices. Interestingly, the N-terminal regions of many ribosomal proteins do not closely match the characteristics of matrix targeting sequences, suggesting that the import processes of these proteins might deviate to some extent from the general import route. So far, the biogenesis of only two ribosomal proteins, Mrpl32 and Mrp10, has been studied experimentally, and these indeed showed surprising differences from the import of other preproteins. In this review article, we summarize the current knowledge on the transport of proteins into the mitochondrial matrix, with a specific focus on proteins of the mitochondrial ribosome.


Subject(s)
Mitochondria/metabolism , Ribosomal Proteins/metabolism , Saccharomyces cerevisiae Proteins/metabolism , Saccharomyces cerevisiae/metabolism , Mitochondria/genetics , Protein Transport/physiology , Ribosomal Proteins/genetics , Saccharomyces cerevisiae/genetics , Saccharomyces cerevisiae Proteins/genetics
15.
IEEE Trans Vis Comput Graph ; 20(12): 2684-93, 2014 Dec.
Article in English | MEDLINE | ID: mdl-26356982

ABSTRACT

Topological and structural analysis of multivariate data is aimed at improving the understanding and usage of such data through identification of intrinsic features and structural relationships among multiple variables. We present two novel methods for simplifying so-called Pareto sets that describe such structural relationships. Such simplification is a precondition for meaningful visualization of structurally rich or noisy data. As a framework for simplification operations, we introduce a decomposition of the data domain into regions of equivalent structural behavior and the reachability graph that describes global connectivity of Pareto extrema. Simplification is then performed as a sequence of edge collapses in this graph; to determine a suitable sequence of such operations, we describe and utilize a comparison measure that reflects the changes to the data that each operation represents. We demonstrate and evaluate our methods on synthetic and real-world examples.

16.
IEEE Trans Vis Comput Graph ; 19(12): 2743-52, 2013 Dec.
Article in English | MEDLINE | ID: mdl-24051841

ABSTRACT

Sets of simulation runs based on parameter and model variation, so-called ensembles, are increasingly used to model physical behaviors whose parameter space is too large or complex to be explored automatically. Visualization plays a key role in conveying important properties in ensembles, such as the degree to which members of the ensemble agree or disagree in their behavior. For ensembles of time-varying vector fields, there are numerous challenges for providing an expressive comparative visualization, among which is the requirement to relate the effect of individual flow divergence to joint transport characteristics of the ensemble. Yet, techniques developed for scalar ensembles are of little use in this context, as the notion of transport induced by a vector field cannot be modeled using such tools. We develop a Lagrangian framework for the comparison of flow fields in an ensemble. Our techniques evaluate individual and joint transport variance and introduce a classification space that facilitates incorporation of these properties into a common ensemble visualization. Variances of Lagrangian neighborhoods are computed using pathline integration and Principal Component Analysis. This allows for the inclusion of uncertainty measurements into the visualization and analysis approach. Our results demonstrate the usefulness and expressiveness of the presented method on several practical examples.


Subject(s)
Computer Graphics , Imaging, Three-Dimensional/methods , Models, Theoretical , Numerical Analysis, Computer-Assisted , Rheology/methods , Subtraction Technique , User-Computer Interface , Algorithms , Models, Statistical
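
The variance of a Lagrangian neighborhood can be sketched compactly: advect a ring of seeds around a point through the flow, then take the principal variances (covariance eigenvalues) of the end positions. A self-contained toy version with an analytic shear flow and forward Euler integration; the paper's pathline integration and PCA are, of course, more careful:

    # Sketch: Lagrangian neighborhood variance via seed advection + PCA.
    import math

    def advect(p, velocity, t0, t1, steps=100):
        x, y = p
        dt = (t1 - t0) / steps
        for k in range(steps):                 # forward Euler, fine for a sketch
            u, v = velocity(x, y, t0 + k * dt)
            x, y = x + dt * u, y + dt * v
        return x, y

    def neighborhood_variance(center, radius, velocity, t0, t1, n=16):
        seeds = [(center[0] + radius * math.cos(2 * math.pi * i / n),
                  center[1] + radius * math.sin(2 * math.pi * i / n))
                 for i in range(n)]
        ends = [advect(s, velocity, t0, t1) for s in seeds]
        mx = sum(x for x, _ in ends) / n
        my = sum(y for _, y in ends) / n
        # 2x2 covariance; its eigenvalues are the principal variances.
        cxx = sum((x - mx) ** 2 for x, _ in ends) / n
        cyy = sum((y - my) ** 2 for _, y in ends) / n
        cxy = sum((x - mx) * (y - my) for x, y in ends) / n
        tr, det = cxx + cyy, cxx * cyy - cxy * cxy
        disc = math.sqrt(max(tr * tr / 4 - det, 0.0))
        return tr / 2 + disc, tr / 2 - disc

    shear = lambda x, y, t: (y, 0.0)           # steady shear: stretches along x
    print(neighborhood_variance((0.0, 0.0), 0.1, shear, 0.0, 2.0))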
17.
IEEE Trans Vis Comput Graph ; 19(10): 1687-99, 2013 Oct.
Article in English | MEDLINE | ID: mdl-23929848

ABSTRACT

Multifluid simulations often create volume fraction data, representing fluid volumes per region or cell of a fluid data set. Accurate and visually realistic extraction of fluid boundaries is a challenging and essential task for efficient analysis of multifluid data. In this work, we present a new material interface reconstruction method for such volume fraction data. Within each cell of the data set, our method utilizes a gradient field approximation based on trilinearly blended Coons patches to generate a volume fraction function, representing the change in volume fractions over the cells. A continuously varying isovalue field is applied to this function to produce a smooth interface that preserves the given volume fractions well. Further, the method allows a user-controlled balance between volume accuracy and physical plausibility of the interface. The method works on two- and three-dimensional Cartesian grids and handles multiple materials. Calculations are performed locally and utilize only the one-ring of cells surrounding a given cell, allowing visualizations of the material interfaces to be easily generated on a GPU or in a large-scale distributed parallel environment. Our results demonstrate the robustness, accuracy, and flexibility of the developed algorithms.
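
The volume-preservation constraint can be made concrete with a toy search: within one cell, pick the isovalue whose sublevel region matches the prescribed volume fraction. Here the cell function is simply sampled on a subgrid and bisected over; this is our illustration of the constraint, not the paper's continuously varying isovalue field:

    # Sketch: per-cell isovalue matching a prescribed volume fraction.

    def isovalue_for_fraction(sample, target_fraction, lo, hi, tol=1e-4):
        # `sample` holds scalar values inside one cell; find an isovalue
        # whose sublevel set covers the target fraction of the samples.
        def fraction_below(iso):
            return sum(1 for s in sample if s < iso) / len(sample)
        while hi - lo > tol:                   # bisection; fraction_below is monotone
            mid = 0.5 * (lo + hi)
            if fraction_below(mid) < target_fraction:
                lo = mid
            else:
                hi = mid
        return 0.5 * (lo + hi)

    # Toy data: values on a 10x10x10 subgrid of one cell, in [0, 1].
    vals = [(i + j + k) / 27.0 for i in range(10) for j in range(10) for k in range(10)]
    iso = isovalue_for_fraction(vals, target_fraction=0.3, lo=0.0, hi=1.0)
    print(round(iso, 3))   # isovalue whose sublevel set fills ~30% of the cell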

18.
IEEE Trans Vis Comput Graph ; 19(9): 1579-91, 2013 Sep.
Article in English | MEDLINE | ID: mdl-23846101

ABSTRACT

Characterizing the interplay between the vortices and forces acting on a wind turbine's blades in a qualitative and quantitative way holds the potential for significantly improving large wind turbine design. This paper introduces an integrated pipeline for highly effective wind and force field analysis and visualization. We extract vortices induced by a turbine's rotation in a wind field, and characterize these vortices in conjunction with numerically simulated forces on the blade surfaces as they strike another turbine's blades downstream. The scientifically relevant issue is the relationship between the extracted approximate locations where vortices strike the blades and the forces at those locations. This integrated approach is used to detect and analyze turbulent flow that causes local impact on the wind turbine blade structure. The results we present are based on analyzing wind and force field data sets generated by numerical simulations, and allow domain scientists to relate vortex-blade interactions to power output loss in turbines and turbine life expectancy. Our methods have the potential to improve turbine design and thereby save costs related to turbine operation and maintenance.

19.
IEEE Trans Vis Comput Graph ; 18(8): 1368-80, 2012 Aug.
Article in English | MEDLINE | ID: mdl-22291157

ABSTRACT

In this paper, we present a novel technique that allows for the coupled computation and visualization of salient flow structures at interactive frame rates. Our approach is built upon a hierarchical representation of the Finite-time Lyapunov Exponent (FTLE) field, which is adaptively sampled and rendered to meet the needs of the current visual setting. The performance of our method allows the user to explore large and complex data sets across scales and to inspect their features at arbitrary resolution. The paper discusses an efficient implementation of this strategy on graphics hardware and provides results for an analytical flow and several CFD simulation data sets.
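
For readers unfamiliar with the quantity being sampled: the FTLE at a point is computed from the gradient of the flow map, here via four advected neighboring seeds and the largest eigenvalue of the Cauchy-Green tensor. A self-contained sketch with an analytic saddle flow, not the paper's GPU implementation:

    # Sketch: FTLE at a point from a finite-difference flow-map gradient.
    import math

    def advect(p, velocity, t0, t1, steps=200):
        x, y = p
        dt = (t1 - t0) / steps
        for k in range(steps):                 # forward Euler, fine for a sketch
            u, v = velocity(x, y, t0 + k * dt)
            x, y = x + dt * u, y + dt * v
        return x, y

    def ftle(p, velocity, t0, t1, h=1e-3):
        x, y = p
        xp = advect((x + h, y), velocity, t0, t1)
        xm = advect((x - h, y), velocity, t0, t1)
        yp = advect((x, y + h), velocity, t0, t1)
        ym = advect((x, y - h), velocity, t0, t1)
        # Flow-map gradient J by central differences.
        j11 = (xp[0] - xm[0]) / (2 * h); j12 = (yp[0] - ym[0]) / (2 * h)
        j21 = (xp[1] - xm[1]) / (2 * h); j22 = (yp[1] - ym[1]) / (2 * h)
        # Cauchy-Green tensor C = J^T J; largest eigenvalue of the 2x2 matrix.
        c11 = j11 * j11 + j21 * j21
        c12 = j11 * j12 + j21 * j22
        c22 = j12 * j12 + j22 * j22
        tr, det = c11 + c22, c11 * c22 - c12 * c12
        lmax = tr / 2 + math.sqrt(max(tr * tr / 4 - det, 0.0))
        return math.log(math.sqrt(lmax)) / abs(t1 - t0)

    saddle = lambda x, y, t: (x, -y)    # separation along x, compression along y
    print(round(ftle((0.5, 0.5), saddle, 0.0, 1.0), 3))
    # -> ~0.998 with Euler discretization; the analytic FTLE of this flow is 1.0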

20.
IEEE Trans Vis Comput Graph ; 18(6): 966-77, 2012 Jun.
Article in English | MEDLINE | ID: mdl-21519102

ABSTRACT

Many flow visualization techniques, especially integration-based methods, are problematic when the measured data exhibit noise and discretization issues. This is particularly the case for flow-sensitive phase-contrast magnetic resonance imaging (PC-MRI) data sets, which record not only anatomic information but also time-varying flow information. We propose a novel approach for the visualization of such data sets using integration-based methods. Our ideas are based upon finite-time Lyapunov exponents (FTLE) and enable the identification of vessel boundaries in the data as regions of high separation. This allows us to correctly restrict integration-based visualization to blood vessels. We validate our technique by comparing our approach to existing anatomy-based methods, and address the benefits and limitations of using FTLE to restrict flow. We also discuss the importance of parameters, i.e., advection length and data resolution, in establishing a well-defined vessel boundary. We extract appropriate flow lines and surfaces that enable the visualization of blood flow within the vessels. We further enhance the visualization by analyzing flow behavior in the seeded region and generating simplified depictions.


Subject(s)
Computer Graphics , Image Processing, Computer-Assisted/methods , Magnetic Resonance Angiography/methods , Models, Cardiovascular , Algorithms , Blood Flow Velocity , Humans , Thorax/anatomy & histology , Thorax/blood supply